
add CLI parameters vllm-api-base vllm-api-key #99

Open

milahu wants to merge 2 commits into datalab-to:master from milahu:add-cli-params

Conversation

milahu commented May 3, 2026

Make chandra work with lm-studio.

I prefer lm-studio over chandra_vllm because a Docker container is overkill.

This should also work with any other OpenAI-compatible backend.

usage

  1. start lm-studio
  2. load the chandra-ocr-2 model
  3. enable the local server: Developer → Start Server
  4. run chandra:
chandra test.png test.png.chandra \
  --method vllm \
  --vllm-api-base http://127.0.0.1:1234/v1 \
  --vllm-api-key lm-studio
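The two new flags just point chandra's client at any OpenAI-compatible /chat/completions endpoint. A minimal sketch of such a request, assuming a hypothetical build_ocr_request helper (this is illustrative, not chandra's actual client code):

```python
import json
import urllib.request


def build_ocr_request(api_base: str, api_key: str, model: str, prompt: str):
    """Build a POST request against an OpenAI-compatible endpoint.

    Hypothetical helper: any server exposing the OpenAI chat API
    (lm-studio, vLLM, etc.) accepts this same shape.
    """
    url = api_base.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_ocr_request(
    "http://127.0.0.1:1234/v1", "lm-studio", "chandra-ocr-2", "OCR this page"
)
# req.full_url is "http://127.0.0.1:1234/v1/chat/completions"
```

lm-studio ignores the API key by default, but OpenAI-style clients require one, which is why the example command passes a dummy value.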

error handling

A wrong --vllm-api-base such as http://127.0.0.1:1234 (missing the /v1 suffix) produced this confusing error:

raw = completion.choices[0].message.content
          ~~~~~~~~~~~~~~~~~~^^^
TypeError: 'NoneType' object is not subscriptable

The code now honors completion.error, and a warning is shown when --vllm-api-base does not end with /v1.
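A sketch of what that error handling could look like; check_api_base and extract_text are hypothetical names for illustration, not the PR's actual functions:

```python
import warnings
from types import SimpleNamespace


def check_api_base(api_base: str) -> None:
    # OpenAI-compatible servers usually serve their API under a /v1
    # prefix; warn early instead of failing later with a cryptic error.
    if not api_base.rstrip("/").endswith("/v1"):
        warnings.warn(
            f"--vllm-api-base {api_base!r} does not end with /v1; "
            "the backend will likely return an error instead of choices"
        )


def extract_text(completion) -> str:
    # Surface a server-side error instead of crashing with
    # "TypeError: 'NoneType' object is not subscriptable".
    error = getattr(completion, "error", None)
    if error:
        raise RuntimeError(f"backend returned an error: {error}")
    if not getattr(completion, "choices", None):
        raise RuntimeError("backend returned no choices")
    return completion.choices[0].message.content


# a well-formed response extracts cleanly
ok = SimpleNamespace(
    error=None,
    choices=[SimpleNamespace(message=SimpleNamespace(content="page text"))],
)
print(extract_text(ok))
```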

AVIF image format

fix #98



Development

Successfully merging this pull request may close these issues.

add support for AVIF image format
